As part of our team, you will take on the following responsibilities:
- You develop and maintain cloud-based data pipelines and data products for the Finance domain, built on our Snowflake database.
- You integrate financial and transactional data from various sources into our central Data Platform, optimize ETL processes, and ensure high data quality.
- You design and build new DataFlows and DataSets for Finance use cases and create and manage the corresponding Power BI reports.
- You collaborate closely with Finance Product Owners, Data Scientists, and Business Units to deliver meaningful analytical data products.
- You take ownership of data governance for all Finance data products and actively drive their further development.
- You support the modernization of our Finance data architecture and contribute to the migration towards cloud- and big-data-based technologies.

What makes you stand out
- You hold a degree in Computer Science, Business Informatics, or a comparable qualification.
- You have strong expertise in SQL databases, including data modeling, table design, and data querying, ideally with experience in Snowflake.
- You bring experience in developing and integrating data products and are familiar with Azure Data Factory, Python, and modern cloud technologies (preferably Azure).
- You have hands-on experience creating meaningful Power BI reports; knowledge of CI/CD or container technologies (e.g., Docker, Kubernetes) is a plus.
- You work analytically, communicate effectively, collaborate well in teams, and demonstrate a strong hands-on mentality.
- You are fluent in English and have very good German skills; you also have a passion for financial databases and enjoy exploring new topics.
Apply statistical methods to detect anomalies, trends, and shifts.

Your Profile
- BSc, MSc, or PhD degree in Mathematics, Computer Science, Information Technology, Physics, or Engineering.
- At least 5 years of experience in data analytics and data mining.
- Technical expert in data analytics and data mining.
- Proficient in Python and SQL.
- Experienced with statistical modelling and data mining techniques.
- Strong problem-solving skills with the ability to multi-task and manage multiple projects simultaneously.
Main Responsibilities:
- Work directly with Sales Engineers, Product Sales Development Managers, and Sales Managers
- Analyze sales data and pricing trends to support improvements in market-specific pricing
- Track online pricing for vacuum pump technology to assess competitive positioning
- Design and generate reports, dashboards, and visualizations for the management team
- Attend sales meetings, business functions, service calls, and customer visits alongside account managers and mentors
- Support development of internal data strategies to drive pricing and market-focused sales initiatives
- Build an appreciation for how data influences direct and indirect sales and marketing strategies
- Perform other related duties as assigned

To succeed, you will need (Skills / Knowledge / Experience):

Education level: Must be a Junior or Senior majoring in:
- Computer Science
- Management Information Systems
- Business / Marketing Analytics
- Industrial Engineering

Knowledge areas / Skills:
- Strong analytical and creative problem-solving skills
- Proficiency in Excel and/or Python; Tableau or Power BI experience preferred
- Self-motivated, independent, flexible, organized, and methodical
- Results-driven, accountable, and ambitious
- Adaptability in a dynamic, fast-paced environment

In return, we offer: We believe there is always a better way.
Review operational costs, negotiate contracts with vendors, and manage vendor relationships.

Requirements:
- Bachelor's degree in Engineering, Computer Science, or a related field.
- HV Authorised Person (experienced with HV systems).
- Electrical or Mechanical Engineering HNC or HND (successfully completed apprenticeship in either).
- C&G Parts 1 & 2, or an equivalent or higher qualification.
- 17th Edition IEE Wiring and Installation (with the ability to attain the 18th Edition through additional training).
- C&G 2391 Test and Inspection; BS 7671:2001 for inspection, testing, and certification.
What You’ll Do:

Collaborate in an Agile, International Team
- Work closely with colleagues from Romania, Germany, and Ukraine
- Design, estimate, develop, and implement software solutions aligned with business needs
- Actively communicate progress, risks, and technical decisions to stakeholders

Build Scalable Data Solutions
- Develop agnostic data products within a modern, cloud-native data ecosystem
- Support use cases across BI, Advanced Analytics, AI, and ML
- Translate business requirements into robust technical architectures
- Continuously enhance performance, quality, and cost-efficiency of solutions
- Proactively suggest improvements and best practices

What makes you stand out
- Degree in Computer Science, Economics, or a comparable qualification
- Minimum 3 years of experience as a BI Engineer or Data Engineer, focused on cloud-based architectures
- Strong expertise in Snowflake and DBT (Data Build Tool)
- Solid knowledge of SQL and data lakehouse architectures; Python is nice to have

Communication is Key
- Excellent communication skills in English, written and spoken (mandatory)
- Ability to clearly explain technical concepts to both technical and non-technical stakeholders
- Strong stakeholder management and collaboration skills
- Comfortable working in cross-border, multicultural teams

We look forward to your application and to applicants who enrich our diverse culture!
What You’ll Do

Drive Measurable Business Impact
- Independently lead AI and analytics initiatives that generate tangible business value
- Actively contribute to and influence the company’s strategic direction

Apply Advanced Analytics & AI
- Develop and apply advanced statistical methods in an agile environment
- Work on business-critical questions using regression models, time series analysis, and machine learning and AI algorithms
- Collaborate cross-functionally or drive initiatives independently

Turn Data into Action
- Design and execute analyses on large datasets
- Translate findings into clear, actionable recommendations
- Work across the full spectrum, from Excel-based analysis to deep learning models

Build Scalable AI Solutions
- Develop and own customized Data Science and AI solutions
- Work with SQL on Azure and Snowflake platforms; Python is nice to have
- Lead projects end-to-end, from Proof of Concept (PoC) to fully operational production models
- Create audience-tailored presentations
- Translate complex analytical insights into clear business language
- Deliver compelling data storytelling to support decision-making at all levels

What makes you stand out
- Minimum 3 years of experience in Data Science, AI, Advanced Analytics, or similar roles
- Proven ability to generate measurable business value through data-driven solutions
- Strong expertise in statistical modeling and machine learning techniques
- Hands-on experience with Python, SQL, Azure, and Snowflake
- Experience building scalable models from PoC to production
- Strong communication and stakeholder management skills
- Ability to explain complex topics to both technical and non-technical audiences
- Bachelor’s or Master’s degree in Finance, Statistics, Computer Science, Mathematics, or a related quantitative field

We look forward to your application and to applicants who enrich our diverse culture!
- Maintain and troubleshoot data integration pipelines to ensure stable data flow into AI and analytics systems
- Support model development by assisting with training, validation, and optimization of machine learning workflows
- Conduct data analysis to extract insights and provide clear reports supporting R&D research questions
- Solve technical challenges related to data access, pipeline performance, and software limitations
- Ensure continuity of ongoing projects by aligning closely with the core team and delivering on timelines
- Perform image analysis and prepare datasets required for scientific and ML use cases
- Manage and improve ETL processes to ensure data quality, structure, and availability
- Document workflows, pipeline changes, and analytical steps to ensure clarity and reproducibility

- Academic background in computer science, data science, engineering, or a related quantitative field
- Strong proficiency in Python with expertise in scientific and analytical libraries
- Skilled in SQL and working with relational databases
- Understanding of ETL concepts and practical experience working with data pipelines
- Solid foundation in machine learning principles and the model lifecycle
- Ability to perform image analysis for scientific or research applications
- Strong communication and interpersonal skills with the ability to collaborate in a technical team
- Independent, structured problem-solver with a commitment to clear documentation and FAIR data practices

- Opportunity to contribute directly to active R&D projects with immediate real-world impact
- Hands-on involvement in AI, machine learning, and data integration challenges in a scientific environment
- Close collaboration with a small, highly skilled technical team

Your contact
Reference number: 863771/1
Phone: +41 44 225 50 00
E-mail: positionen@hays.ch
Employment type: Freelance, project-based
Responsibilities:
- Identify potential: Analyze customer requirements and develop data-driven solutions, from analytics through machine learning to GenAI.
- Develop strategies: Design tailored Data & AI strategies for a wide range of use cases.
- Scale solutions: Design future-proof architectures for data platforms, ML pipelines, and AI systems.
- Evaluate technologies: Test and recommend suitable technologies, and guide implementation teams with clear roadmaps.
- Grow the business: Support pre-sales activities, market analyses, and the further development of data-driven business areas.
- Help shape proposals: Contribute your ideas to solution concepts, cost calculations, and presentations.

Your profile:
- Academic background: Successfully completed degree in (business) informatics, computer science, or a comparable qualification.
- Experience: Several years of experience in designing and implementing modern data platforms, ML solutions, and AI systems.
- Technology skills: Solid experience with Azure, Databricks, or AWS, as well as with Python, SQL, Docker, CI/CD pipelines, and Infrastructure as Code.
- Domain expertise: Deep understanding of modern data architectures, MLOps, and data governance.
- Working style: A combination of conceptual strength, analytical thinking, and a pragmatic hands-on mentality.
- Communication: Ability to convey complex topics clearly and convincingly, in both German and English.

What we offer:
- Flexibility: Flexible working models with a generous flextime arrangement, either at one of our locations or up to 100% remote within Germany.
- Onboarding & development: Your individual onboarding and career development matter to us.